2 - Project Representation Learning 2022

Hello everybody, and welcome to the project Representation Learning for the winter term 2022.

So my name is Bernda Krenz.

I'm leading this project together with Johanna Müller, Misha Dombrovski and Luca Schmidtke, who will support us in this project.

Now what is this project about?

Well, essentially it is about representation learning, and there is quite some debate about what representation learning actually is.

Representation learning has a lot of definitions, but right now you could probably claim that all of data science is basically representation learning.

You need to represent data in a maximally informative way.

There has always been some notion of representation learning, such as feature learning, which is very relevant for high-dimensional data like images, audio, or any other high-dimensional data that you need to break down into something meaningful and informative for your downstream tasks.

There's an interesting paper I recommend you read, giving these authors' view on representation learning.

It's an interesting read; it gives you an overview of the area and what's currently happening.

It doesn't introduce new methods or anything; it's really just a perspective.

Now there are some definitions, like: a data representation is a formal system which makes explicit certain entities and types of information, meaning it is able to filter out irrelevant information and keep only the information that is relevant for a specific task.

Well, and this representation can be operated on by an algorithm in order to achieve some information processing goal, something like distinguishing cats from dogs or maybe doing medical diagnosis.

But representations differ in terms of what information they make explicit and in terms of what algorithms they support.

So each representation can be very different.
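To make this idea concrete, here is a minimal sketch, not from the lecture: a frozen pretrained encoder provides the representation, and a small linear classifier is the algorithm that operates on it for the cats-vs-dogs example. The choice of a torchvision ResNet-18, the two-class labels, and the function names are all illustrative assumptions.

```python
# Minimal sketch (assumed, not from the lecture): a frozen pretrained
# encoder provides the representation, and a linear probe "operates on"
# it for a hypothetical cats-vs-dogs classification task.
import torch
import torch.nn as nn
from torchvision import models

# Frozen encoder: maps an image batch to 512-dimensional feature vectors.
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = nn.Identity()              # drop the original ImageNet head
encoder.eval()
for p in encoder.parameters():
    p.requires_grad = False

# The algorithm operating on the representation: a linear probe.
probe = nn.Linear(512, 2)               # two classes: cat vs. dog
optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimisation step of the probe on a batch of images/labels."""
    with torch.no_grad():
        features = encoder(images)      # the representation, shape (B, 512)
    logits = probe(features)
    loss = criterion(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The frozen features are the representation; the linear probe is just one of many algorithms that could operate on them, and a nearest-neighbour search or a clustering step over the same features would be equally valid choices.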

There is also a slightly different stream going on in the community, which is moving towards building foundation models: representations that support any sort of downstream task and can be refined for any sort of downstream task.

But this is a sub-area of representation learning itself.
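As a hedged illustration, again not the lecture's own code, of what refining such a representation for one specific downstream task could look like: the same assumed pretrained encoder is now unfrozen and updated jointly with a freshly initialised task head. All names and hyperparameters are illustrative.

```python
# Minimal sketch (assumed): refining a general-purpose representation
# for one downstream task by updating the whole encoder together with
# a new classification head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(512, 2)            # replace the head for the new task
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # small LR for fine-tuning

def finetune_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """Jointly update encoder and head on the downstream task."""
    model.train()
    logits = model(images)
    loss = nn.functional.cross_entropy(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Compared to the frozen probe above, all encoder weights move here, which is roughly what "refining" a foundation-model-style representation means in practice.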

Now, in this project, your learning outcome is really to learn how to do independent scientific work, something that prepares you for your master's thesis.

Maybe some of you will go on to academic careers, but also in industry it's very important to be able to follow a scientific working approach.

And scientific working mainly means that you don't make any claims out of thin air.

All of your claims, all of your statements, are either hypothesis-driven and/or evidence-supported, so that your claims and arguments always rest on a solid basis.

You'll also hopefully learn good scientific writing skills.

Writing is one of the most important skills in this trade.

And well, what is good scientific writing?

It's good language, a kind of condensed style, so that you have a high information density and not too much text.

Another learning objective is of course to understand and reproduce representation learning methods.

In this project, we'll have a lot of computer vision topics.

Feel free to also provide and suggest other topics.

But most of the current work really has its biggest impact in the domain of computer vision.

So our data points are usually images, and the labels are whatever they might be: cats and dogs, maybe text, or even other images.

And what is really a key learning objective here is, well, which I think is one of the
